On the convergence rate of Fletcher–Reeves nonlinear conjugate gradient methods satisfying strong Wolfe conditions: Application to parameter identification in problems governed by general dynamics

Authors

Abstract

Over the last decades, many efforts have been made toward understanding the convergence rate of gradient-based methods for both constrained and unconstrained optimization. The cases of strongly and weakly convex payoff functions have been extensively studied and are nowadays well understood. Despite impressive advances in the optimization context, nonlinear non-convex problems have still not been fully explored. In this paper, we are concerned with a nonlinear problem under system dynamics constraints. We apply our analysis to parameter identification in systems governed by general differential equations. The considered inverse problem is formulated using optimal control tools. We tackle it through the use of the Fletcher–Reeves conjugate gradient method satisfying the strong Wolfe conditions with an inexact line search. We rigorously establish and report a new linear convergence rate, which forms the main contribution of this work. The theoretical result requires that the second derivative of the functional be continuous and bounded. Numerical evidence on a selection of popular models, as a direct application, supports our findings.
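For reference, the following is a minimal sketch of a Fletcher–Reeves conjugate gradient iteration with a strong Wolfe inexact line search, the kind of scheme analysed in the abstract. The quadratic test problem, the choice c2 = 0.4, and the helper fletcher_reeves are illustrative assumptions, not taken from the paper; the line search is delegated to scipy.optimize.line_search, which enforces the strong Wolfe conditions.

# Minimal sketch: Fletcher-Reeves nonlinear CG with a strong Wolfe line search.
# The objective below is an illustrative strictly convex quadratic, not the
# parameter-identification functional of the paper.
import numpy as np
from scipy.optimize import line_search

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=500, c1=1e-4, c2=0.4):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Inexact line search satisfying the strong Wolfe conditions
        # (c2 < 1/2 is the usual requirement for descent with Fletcher-Reeves).
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:                     # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves coefficient
        d = -g_new + beta * d                 # new conjugate direction
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    obj = lambda x: 0.5 * x @ A @ x - b @ x
    gradient = lambda x: A @ x - b
    print(fletcher_reeves(obj, gradient, np.zeros(2)), np.linalg.solve(A, b))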


Similar Articles

Convergence Properties of Nonlinear Conjugate Gradient Methods

Recently, important contributions to the convergence analysis of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a “sufficient descent condition” to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. [6] hints that the sufficient descent condition, which was enforced by their ...
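For reference, the sufficient descent condition mentioned here is usually stated, with $g_k = \nabla f(x_k)$ and search direction $d_k$, as

$g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{for all } k \ge 0,$

for some constant $c > 0$ independent of $k$.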


Nonlinear Conjugate Gradient Methods with Wolfe Type Line Search

$= \dfrac{\|d_{k-1}\|^2}{\|g_{k-1}\|^4} + \dfrac{1}{\|g_k\|^2} - \dfrac{\beta_k^2\,(g_k^{\top} d_{k-1})^2}{\|g_k\|^4}$
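For context, this fragment appears to arise from expanding the squared norm of the conjugate gradient direction generated by the usual recurrence

$d_k = -g_k + \beta_k d_{k-1}, \qquad \beta_k^{FR} = \dfrac{\|g_k\|^2}{\|g_{k-1}\|^2},$

after dividing by $\|g_k\|^4$; the full chain of inequalities is not recoverable from the snippet.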


Application of frames in Chebyshev and conjugate gradient methods

Given a frame of a separable Hilbert space $H$, we present some iterative methods for solving an operator equation $Lu=f$, where $L$ is a bounded, invertible and symmetric operator on $H$. We present some algorithms based on the knowledge of frame bounds, the Chebyshev method and the conjugate gradient method, in order to give some approximated solutions to the problem. Then we i...
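As a companion to the snippet above, here is a minimal sketch of the classical conjugate gradient iteration for $Lu=f$ with $L$ symmetric positive definite, assuming a finite-dimensional matrix representation of the operator; the frame-based variants described in that abstract are not reproduced here.

import numpy as np

def conjugate_gradient(L, f, u0=None, tol=1e-10, max_iter=1000):
    # Solve L u = f for symmetric positive definite L (dense ndarray here).
    u = np.zeros_like(f, dtype=float) if u0 is None else np.asarray(u0, dtype=float)
    r = f - L @ u                  # residual
    d = r.copy()                   # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Ld = L @ d
        alpha = rs / (d @ Ld)      # optimal step length along d
        u = u + alpha * d
        r = r - alpha * Ld
        rs_new = r @ r
        d = r + (rs_new / rs) * d  # conjugate direction update
        rs = rs_new
    return u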


A three-parameter family of nonlinear conjugate gradient methods

In this paper, we propose a three-parameter family of conjugate gradient methods for unconstrained optimization. The three-parameter family of methods not only includes the already existing six practical nonlinear conjugate gradient methods, but subsumes some other families of nonlinear conjugate gradient methods as its subfamilies. With Powell’s restart criterion, the three-parameter family of...
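The “six practical nonlinear conjugate gradient methods” referred to are presumably the classical choices of the update coefficient (with $y_{k-1} = g_k - g_{k-1}$), namely

$\beta_k^{FR}=\dfrac{\|g_k\|^2}{\|g_{k-1}\|^2},\quad \beta_k^{PRP}=\dfrac{g_k^{\top}y_{k-1}}{\|g_{k-1}\|^2},\quad \beta_k^{HS}=\dfrac{g_k^{\top}y_{k-1}}{d_{k-1}^{\top}y_{k-1}},\quad \beta_k^{DY}=\dfrac{\|g_k\|^2}{d_{k-1}^{\top}y_{k-1}},\quad \beta_k^{CD}=\dfrac{\|g_k\|^2}{-d_{k-1}^{\top}g_{k-1}},\quad \beta_k^{LS}=\dfrac{g_k^{\top}y_{k-1}}{-d_{k-1}^{\top}g_{k-1}},$

all of which fit the common recurrence $d_k = -g_k + \beta_k d_{k-1}$.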



Journal

Journal title: Mathematical Methods in the Applied Sciences

Year: 2021

ISSN: 1099-1476, 0170-4214

DOI: https://doi.org/10.1002/mma.8009